
Add support of GGUF as an input format for LLM #1885

Open
AlexKoff88 wants to merge 52 commits into master

Conversation

AlexKoff88 (Collaborator)

No description provided.
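
Since the PR description is empty, here is a minimal sketch of the intended user-facing flow, assuming the LLMPipeline constructor accepts a path to a .gguf file directly once this change lands; the model file name and device below are placeholders, not taken from this PR.

import openvino_genai as ov_genai

# Hypothetical usage of GGUF as an input format: pass a .gguf file instead of
# an OpenVINO IR directory (file name and device are placeholders).
pipe = ov_genai.LLMPipeline("qwen2.5-0.5b-instruct-q4_0.gguf", "CPU")
print(pipe.generate("What is OpenVINO?", max_new_tokens=64))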

@github-actions bot removed the category: WWB label on Mar 11, 2025
@AlexKoff88 changed the title from "[DRAFT]: Add support of instantiating model from GGUF." to "[DRAFT]: Add support of GGUF as a input format for LLM" on Mar 12, 2025
@github-actions bot added the category: samples label on Mar 13, 2025
@github-actions bot added the category: continuous batching and category: LLM labels on Mar 26, 2025
@AlexKoff88 changed the title from "[DRAFT]: Add support of GGUF as a input format for LLM" to "Add support of GGUF as a input format for LLM" on Mar 26, 2025
    return [PipelineType.STATEFUL, PipelineType.PAGED_ATTENTION, PipelineType.SPECULATIVE_DECODING, PipelineType.PROMPT_LOOKUP_DECODING]

def get_gguf_pipeline_types():
    return [PipelineType.STATEFUL, PipelineType.PAGED_ATTENTION]
Contributor:

After #1976 is merged, we will check whether the GGUF model (modeling) can be converted to a PA-based model.
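
For context, a minimal self-contained sketch of how such a helper is typically consumed in the test suite: a pytest parametrization over get_gguf_pipeline_types(). The enum members and the get_all_pipeline_types name are stand-ins inferred from the diff above, not code from this PR.

import pytest
from enum import Enum, auto

# Stand-ins for the types/helpers shown in the diff, so the example runs on its own.
class PipelineType(Enum):
    STATEFUL = auto()
    PAGED_ATTENTION = auto()
    SPECULATIVE_DECODING = auto()
    PROMPT_LOOKUP_DECODING = auto()

def get_all_pipeline_types():
    return [PipelineType.STATEFUL, PipelineType.PAGED_ATTENTION,
            PipelineType.SPECULATIVE_DECODING, PipelineType.PROMPT_LOOKUP_DECODING]

def get_gguf_pipeline_types():
    return [PipelineType.STATEFUL, PipelineType.PAGED_ATTENTION]

@pytest.mark.parametrize("pipeline_type", get_gguf_pipeline_types())
def test_gguf_pipeline_type_is_supported(pipeline_type):
    # Until speculative and prompt-lookup decoding are verified for GGUF inputs,
    # only the stateful and paged-attention pipelines are exercised.
    assert pipeline_type in get_all_pipeline_types()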

Contributor:

Please update to the latest master; the PR was merged.

@ilya-lavrenov added this to the 2025.2 milestone on Mar 26, 2025
@ilya-lavrenov self-assigned this on Mar 26, 2025
Contributor:

Please remove all changes in the samples folder.

AlexKoff88 (Collaborator, Author), Mar 27, 2025:

Do you mean to remove this sample?

Labels
category: cmake / build (Cmake scripts); category: continuous batching (Continuous batching); category: LLM (LLM pipeline (stateful, static)); category: samples (GenAI samples); no-match-files